Dimension-free Information Concentration via Exp-Concavity

Authors

  • Ya-Ping Hsieh
  • Volkan Cevher
Abstract

Information concentration of probability measures has important implications in learning theory. Recently, it was discovered that the information content of a log-concave distribution concentrates around its differential entropy, albeit with an unpleasant dependence on the ambient dimension. In this work, we prove that if the potential of the log-concave distribution is exp-concave, a central notion for fast rates in online and statistical learning, then the concentration of information can be further improved to depend only on the exp-concavity parameter, and hence can be dimension-independent. Central to our proof is a novel yet simple application of the variance Brascamp-Lieb inequality. In the context of learning theory, our concentration-of-information result immediately yields high-probability versions of many previous bounds that hold only in expectation.
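The phenomenon the abstract describes can be illustrated with a minimal Monte Carlo sketch (not from the paper; the dimension `d`, sample count `n`, and choice of a standard Gaussian are our own assumptions). For the standard Gaussian, a log-concave measure with potential V(x) = ||x||²/2, the information content −log p(X) has mean equal to the differential entropy (d/2)·log(2πe) and variance d/2, so its fluctuations grow like √d while the entropy itself grows linearly in d:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 50         # ambient dimension (illustrative choice)
n = 100_000    # number of Monte Carlo samples

# Standard Gaussian: log-concave with potential V(x) = ||x||^2 / 2.
X = rng.standard_normal((n, d))

# Information content: -log p(X) = (d/2) log(2*pi) + ||X||^2 / 2.
info = 0.5 * d * np.log(2 * np.pi) + 0.5 * np.sum(X**2, axis=1)

# Differential entropy of N(0, I_d): h = (d/2) log(2*pi*e).
entropy = 0.5 * d * np.log(2 * np.pi * np.e)

# Empirical mean of the information content matches the entropy,
# and its variance is d/2 (||X||^2 is chi-square with d degrees of freedom).
print(np.mean(info), entropy)   # the two values are close
print(np.var(info), d / 2)      # the two values are close
```

This only demonstrates the baseline dimension dependence (O(√d) fluctuations); the paper's contribution is that exp-concavity of the potential removes even this dependence.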


Similar articles

Concentration for Infinitely Divisible Vectors with Independent Components

For various classes of Lipschitz functions, we provide dimension-free concentration inequalities for infinitely divisible random vectors with independent components and finite exponential moments. The purpose of this note is to revisit the concentration phenomenon for infinitely divisible vectors with independent components in an attempt to obtain dimension-free concentration. Let X ∼ ID(γ...


Evaluation of the concavity depth and inclination in jaws using CBCT

Introduction: Nowadays, using implants in patients' treatment plans has become popular. The aim of this study was to determine the prevalence of mandibular lingual and maxillary buccal concavity, mean concavity depth and angle, and their relation to age and gender. Materials and Methods: In 200 CBCTs, concavity depth and angle were measured 2 mm superior to the inferior alveolar c...


Exp-Concavity of Proper Composite Losses

The goal of online prediction with expert advice is to find a decision strategy which will perform almost as well as the best expert in a given pool of experts, on any sequence of outcomes. This problem has been widely studied and O(√T) and O(log T) regret bounds can be achieved for convex losses (Zinkevich (2003)) and strictly convex losses with bounded first and second derivatives (Hazan ...


Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures

Sufficient conditions are developed, under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. Recently, one of the authors [O. Johnson, Stoch. Proc. Appl., 2007] used a semigroup approach to show that the Poisson has maximal entropy among all ultra-log-concave distributions with fixed mean. We show via a non-tr...


About Approximations of Exponentials

We look for the approximation of exp(A1 + A2) by a product of the form exp(x1A1) exp(y1A2) · · · exp(xnA1) exp(ynA2). We are especially interested in minimal approximations with respect to the number of terms. After having shown some isomorphisms between specific free Lie subalgebras, we will prove the equivalence of the search for such approximations and approximations of exp(A1 + · · ·+ An). The m...



Journal:
  • CoRR

Volume: abs/1802.09301  Issue: 

Pages: -

Publication date: 2018